# Multilingual Healthcare
## DrMedra4B GGUF

- **Author:** mradermacher
- **License:** Apache-2.0
- **Tags:** Large Language Model · Transformers · Supports Multiple Languages

Quantized (GGUF) version of the pretrained model drwlf/DrMedra4B, suitable for medical AI tasks.
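A GGUF quant like this can be run locally with llama.cpp bindings. Below is a minimal sketch using llama-cpp-python; the repo id `mradermacher/DrMedra4B-GGUF` and the `Q4_K_M` quant filename are assumptions, so check the model page for the files actually published.

```python
# Minimal sketch: run a GGUF quant locally with llama-cpp-python.
# Repo id and quant filename are assumptions -- verify on the model page.
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="mradermacher/DrMedra4B-GGUF",  # assumed repo id
    filename="*Q4_K_M.gguf",                # assumed quant level; glob picks a matching file
    n_ctx=4096,
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "List common symptoms of iron-deficiency anemia."}],
    max_tokens=256,
)
print(out["choices"][0]["message"]["content"])
```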
## MMedLM

- **Author:** Henrychur
- **License:** Apache-2.0
- **Tags:** Large Language Model · Transformers · Supports Multiple Languages

MMedLM is a multilingual healthcare foundation model with 7 billion parameters, based on the InternLM architecture and pretrained on MMedC, a comprehensive multilingual medical corpus.
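Since the card lists Transformers support, a standard `AutoModelForCausalLM` load should work. The sketch below assumes the hub id `Henrychur/MMedLM`; InternLM-based checkpoints typically require `trust_remote_code=True`, and this is a base (completion-style) model rather than a chat model.

```python
# Minimal sketch: load MMedLM with Hugging Face Transformers.
# "Henrychur/MMedLM" is the assumed hub id -- verify before use.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Henrychur/MMedLM"  # assumed hub id

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,   # halve memory; use float32 on CPU-only setups
    trust_remote_code=True,      # InternLM-based models ship custom modeling code
)

# Base model, so prompt in completion style rather than chat format.
prompt = "Question: What is the first-line treatment for type 2 diabetes?\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```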